Humanoid robot AMI

AMI (아미), short for Artificial Intelligence & Multimedia Innovative, is a humanoid robot invented by Hyun Seung Yang. AMI was designed in 2001 as an innovative robot able to act like a human; it can communicate freely with humans and express various emotions through a monitor on its chest. AMI uses visual recognition to extract 3D object information and has both ultrasonic and infrared sensors to avoid obstacles when moving.[1]

Inventor

Hyun Seung Yang, a professor at the Korea Advanced Institute of Science and Technology (KAIST), invented AMI in the KAIST Artificial Intelligence & Multimedia Lab. Yang earned his bachelor's degree in Electronics Engineering from Seoul National University in 1976, and M.S.E.E. and Ph.D. degrees from the School of Electrical Engineering at Purdue University in 1983 and 1986. He has been a professor in the Department of Computer Science at KAIST since 1988. Since 1990, the Korea Science and Engineering Foundation (KOSEF) has sponsored his Computer Vision and Intelligent Robotics Lab, a center for artificial intelligence research. He was president of the Society of Artificial Intelligence Research of the Korea Information Science Society from 1991 to 1993. Since then, he has researched humanoid robotics, computer vision, interactive media art, and multimedia.[2]

Development of AMI

AMI was developed by Hyun Seung Yang and his Artificial Intelligence Research (AIR) team at the Artificial Intelligence and Media Lab. Their goal was to make a robot that resembles a human being and can communicate effectively with a human by expressing its emotional state. After three years of research and development, Yang and his team achieved that goal with a humanoid robot that cost only 300 million won (approximately 300,000 dollars) to develop. AMI has a human-looking face with visual sensors to detect 3D objects, two arms that can grab objects, a monitor on its chest that shows how AMI feels at any moment, and a wheeled base that allows it to move. AMI has a total of thirty-two degrees of freedom (DOF), which allow it to move smoothly.[1] Degrees of freedom are independent position variables that represent the number of independent ways a robot's joints can move.[3] AMI has seven DOF in its head, five in each arm, six in each hand, one in its waist, and two in its vehicle base. AMI appeals to consumers more than other robots because its size and appearance resemble a human and it is projected to be relatively inexpensive to buy.[1]
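
As a quick check of the DOF breakdown above, the joint counts cited from [1] sum to thirty-two. The short Python sketch below only tallies those figures; the grouping into a dictionary is an illustrative convenience, not part of AMI's design:

    # Tally of AMI's degrees of freedom as reported in [1].
    dof = {
        "head": 7,
        "arms": 2 * 5,     # five DOF in each arm
        "hands": 2 * 6,    # six DOF in each hand
        "waist": 1,
        "vehicle": 2,
    }
    print(sum(dof.values()))  # 32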

History of robots made in Korea

AMI was a humanoid robot whose functions built on earlier robots invented by Hyun Seung Yang and his AIR team. CAIR-1 was the earliest of these robots with an emotional function and the ability to move. The Kkumdori robot (꿈돌이로봇), built in 1993, was displayed at the Daejeon Expo and received an honorable mention from audiences and experts. CAIR-2 won the international mobile robot competition (국제이동로봇경진대회) in 1995. Building on the technologies used in these earlier robots, Yang and his AIR team further improved the intelligent mobile platform and developed the computer hardware and the intelligent and emotional software needed to control the upper body, both hands, and the head effectively.[1]

Specifications

AMI is capable of various activities, such as communicating with a human through emotional expressions and appropriate gestures, or extracting 3D information with its sensing devices. It has two CCD cameras, eighteen ultrasonic sensors, twelve infrared sensors, six integrated-circuit pressure sensors, and a wireless microphone for recognizing 3D objects. More importantly, AMI can avoid obstacles by analyzing the reflections of ultrasonic and infrared signals: the eighteen ultrasonic sensors and twelve infrared sensors emit pulses in the direction AMI is heading and detect from the reflections whether an obstacle lies in its path. AMI has strong abilities in Human Robot Interaction (HRI). Because it has a total of thirty-two degrees of freedom, AMI can handle objects freely with its hands. When AMI communicates with a human, its voice recognition system lets it recognize fifty different words. In response, AMI produces appropriate reactions and gestures and expresses its feelings through the monitor on its chest.[1]
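
The sketch below illustrates the kind of reflection-based obstacle check described above, assuming the sensors report distances to the nearest reflecting surface. The threshold, function name, and example readings are hypothetical and are not taken from AMI's actual control software:

    # Hypothetical fusion of ultrasonic and infrared range readings to
    # decide whether the path ahead is blocked.
    OBSTACLE_THRESHOLD_M = 0.5  # assumed stopping distance, not from the source

    def path_is_blocked(ultrasonic_m, infrared_m, threshold=OBSTACLE_THRESHOLD_M):
        """Return True if any forward-facing sensor reports an obstacle
        closer than the threshold distance (in metres)."""
        readings = list(ultrasonic_m) + list(infrared_m)
        return any(d < threshold for d in readings)

    # Example: eighteen ultrasonic and twelve infrared readings, one close object.
    ultrasonic = [2.0] * 17 + [0.3]
    infrared = [1.5] * 12
    print(path_is_blocked(ultrasonic, infrared))  # True -> stop or steer around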

Operation of AMI

AMI can operate autonomously; however, wearable computing hardware also allows a user to control it. The wearable hardware is designed to be lightweight and comfortable for the user, to run for more than an hour, and to connect to nearby sensors. For the wearable system control unit, USB is used as the interface between the devices attached to AMI and the user, so that a variety of inputs and outputs can be connected effectively. To make communication between the user and electronic media convenient, IWAS is used so that the user can control electronic devices easily without detailed knowledge of them. IWAS is composed of several devices: a microphone, a Force Sensing Resistor (FSR), and the MI-A330LS. The microphone captures the user's speech commands. The FSR handles touch input: FSR sensors measure the pressure at the spot where the user touches the suit. The MI-A330LS is a three-axis postural sensing unit for gesture detection; it estimates roll, pitch, and yaw (the three rotation angles) from the accelerations measured along its three axes.

Interactive Electronic Media (IEM) also plays a role in the interaction between the user and electronic devices. IEM is electronic media that automatically decides how to respond depending on the user's emotional state. For example, a curtain with an IEM system can sense its user's emotional condition and react accordingly, such as by adjusting the light in the house. Through these systems, the wearable computing platform (IWAS) gives a user several ways to control AMI effectively: the user can see images sent from AMI in real time, hear the sounds AMI picks up through headphones, command AMI through the microphone in the master system, and make AMI pose or move as desired.[4]
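
As an illustration of the kind of postural sensing the MI-A330LS performs, the sketch below estimates roll and pitch from a static three-axis accelerometer reading. This is a generic tilt-from-gravity calculation under the assumption that the sensor is nearly stationary, not the unit's actual firmware; yaw cannot be recovered from accelerations alone and would need additional sensing:

    import math

    def tilt_from_acceleration(ax, ay, az):
        """Estimate roll and pitch (in degrees) from a static three-axis
        accelerometer reading, where gravity dominates the measurement."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return math.degrees(roll), math.degrees(pitch)

    # Example: sensor tilted slightly forward (readings in units of g).
    print(tilt_from_acceleration(0.17, 0.0, 0.98))  # roll ~ 0.0, pitch ~ -9.8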

Communication

AMI was considered very innovative in 2001 because of its ability to communicate. When Hyun Seung Yang designed and developed AMI with his AIR team, he set a list of goals for AMI's effective communication.

First, the robot must have a social personality to induce interactions, and must approach people first to initiate interactions. Secondly, to enable more robust emotional interactions, the robot has to change the conversational topic based not only on its own emotions but also on the user's emotions. Thirdly, the robot must keep memories of previous interactions with people, to lead conversations more naturally based on previous events and emotions. Lastly, the robot has to keep guessing the user's response based on the current context. Further, the robot must not only lead interactions but also communicate with people through multimodality.[2]

These goals were established not only to achieve successful communication but also to let the robot lead interactions with a human. To achieve them, Yang and his AIR team used a sub-system architecture with five components: a perception, motivation, memory, behavior, and expression system. The perception system is essentially an emotion recognizer. The remaining sub-systems decide appropriate reactions to the user's emotional condition based on the robot's own drive, emotions, and memory. Together, these sub-systems determine AMI's behavior and reactions during a conversation by recognizing the user's emotional state.[2] Despite the sub-systems, AMI lacks understanding of natural dialogue because of the limits of the technology of the time. Yang and his AIR team therefore devised a strategy to prevent AMI and a human from talking at cross purposes: AMI asks about the user's intention in advance and leads the conversation, so that it can prepare how to react within a known arrangement of topics.[2]
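
The sketch below is a hypothetical rendering of how such a pipeline might choose a response: a perceived user emotion, together with a memory of previous interactions, feeds the behavior decision. The class, function names, and the specific rules are invented for illustration and do not come from the AIR team's implementation:

    # Hypothetical sketch of the perception -> memory -> behavior flow.
    from dataclasses import dataclass, field

    @dataclass
    class Memory:
        past_emotions: list = field(default_factory=list)

        def remember(self, emotion):
            self.past_emotions.append(emotion)

    def choose_behavior(user_emotion, memory):
        """Pick a conversational behavior from the user's perceived emotion
        and the history of previous interactions."""
        memory.remember(user_emotion)
        if user_emotion == "sad":
            return "comfort the user and suggest a lighter topic"
        if memory.past_emotions.count("happy") >= 2:
            return "continue the current topic enthusiastically"
        return "ask the user what they would like to talk about"

    memory = Memory()
    print(choose_behavior("happy", memory))  # asks for a topic
    print(choose_behavior("happy", memory))  # history now shifts the behavior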

Emotion

The significance of AMI is that it not only communicates with a human but also expresses its emotional state through facial expressions, voice, gestures, and postures. AMI's expression system consists of three parts: a dialogue expression system, a 3D facial expression system, and a gesture expression system. The dialogue expression system carries out AMI's behavior and also expresses its emotion through its face and gestures. The 3D facial expression system displays AMI's inner emotional state through the monitor on its chest. AMI has three types of emotions: anger, sadness, and happiness,[1] and the monitor on its chest shows one of the three. Finally, the gesture expression system was created to make conversation more realistic and to make the user perceive AMI as a human-looking, friendly robot.[2]
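
A minimal, hypothetical sketch of mapping AMI's three emotions to what the chest monitor and gestures might show; the specific cues and names are assumptions made for illustration, not the actual expression system:

    # Hypothetical mapping from AMI's three emotions to display and gesture cues.
    EXPRESSIONS = {
        "happiness": {"face": "smiling 3D face", "gesture": "open, welcoming arms"},
        "sadness": {"face": "downcast 3D face", "gesture": "lowered head and arms"},
        "anger": {"face": "frowning 3D face", "gesture": "stiff, closed posture"},
    }

    def express(emotion):
        cue = EXPRESSIONS.get(emotion)
        if cue is None:
            raise ValueError(f"unknown emotion: {emotion}")
        return f"monitor shows {cue['face']}; robot performs {cue['gesture']}"

    print(express("happiness"))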

Conclusion

AMI, a humanoid robot invented by Hyun Seung Yang, was considered a remarkably innovative robot when it first appeared in 2001. Its distinguishing feature was not the ability to move and act, which other robots share, but the ability to communicate and express its feelings through nonverbal communication. Building on technologies used in earlier successful robots such as CAIR-2 and the Kkumdori robot, Yang and his AIR team succeeded in designing and developing several systems that support AMI in a variety of ways. By inventing the wearable computing hardware known as IWAS, they made it possible for a user to control AMI through its sub-systems. For communication, they worked around the technological limits that keep AMI from having fully natural dialogues by developing a new strategy: AMI leads the conversation and asks the user's intention in advance so that it can determine how to react. Moreover, AMI can express its own feelings through its monitor and gestures. The quality of robotic technology continues to improve with ongoing advances, so humanoid robots like AMI may well become part of our daily lives in the future.

References

  1. ^ a b c d e f "아미/AMI". Robohouse. http://www.roboman.co.kr/bbs/board.php?bo_table=02_4&wr_id=33. Retrieved 30 October 2011.
  2. ^ a b c d e Hye Won Jung; Yong Ho Seo; M. Shangwon Ryoo; Hyun Seung Yang. "Affective Communication System with Multimodality for a Humanoid Robot". Humanoid Robots, 2004 4th IEEE/RAS International Conference on, vol. 2. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1442679. Retrieved 30 October 2011.
  3. ^ Tesar, Delbert. "Degree of Freedom". Robotics Research Group, Department of Energy–University Research Program in Robotics. http://www.robotics.utexas.edu/rrg/learn_more/low_ed/dof/. Retrieved 30 October 2011.
  4. ^ Yang, Hyun Seung. "Artificial Intelligence and Media Lab: Ubiwear Computing Research". mind.kaist.ac.kr. Retrieved 30 October 2011.
